Recently, Professor Yu Yushu's team at Beijing Institute of Technology published papers in IEEE RAL, IEEE TRO, and IEEE TASE. Their research addresses relative pose estimation and global positioning for Integrated Aerial Platforms (IAPs), proposing a control and state estimation framework that lays a solid foundation for flying robotic systems performing multifunctional aerial operations. The research employs the NOKOV Motion Capture System to provide high-precision ground truth data for the IAPs.
IEEE RAL (2024): Multi-Agent Visual-Inertial Localization for Integrated Aerial Systems With Loose Fusion of Odometry and Kinematics
This study pioneers a multi-agent localization framework for IAPs, integrating each UAV's visual-inertial odometry with the platform's internal kinematic constraints. The approach fully leverages this intrinsic geometric information to overcome the positioning drift and accuracy degradation caused by the platform's motion constraints.
A three-agent IAP and an illustration of its reference frames
The research team first derived a universal constraint formulation that is independent of specific kinematic parameters and applicable to diverse IAP configurations, significantly enhancing the versatility and robustness of the system. Building on this foundation, the team developed a sliding-window optimization-based state estimator that fuses each UAV's visual-inertial odometry with the IAP's internal kinematic constraints. Through this optimization, the system estimates the relative transformations between agents during motion, providing a practical pathway to high-precision autonomous localization for integrated multi-robot systems in complex environments. Experimental results demonstrate that the method significantly improves positioning accuracy, with markedly reduced global positioning drift and substantially lower relative positioning errors compared to baseline methods.
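To make the loose-fusion idea concrete, the following sketch estimates the alignment between two agents' odometry frames by enforcing a kinematic constraint over a short sliding window. It is a minimal 2-D illustration under assumed geometry and noise levels, not the authors' implementation, which operates on full visual-inertial states.

```python
# Minimal sketch (not the paper's implementation): loosely fusing per-UAV
# odometry with an assumed rigid-platform kinematic constraint over a
# sliding window to recover the alignment between two odometry frames.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def rot2d(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Assumed kinematic constraint: agent j sits at a fixed 2-D offset from
# agent i in the shared platform frame (rigid three-vehicle platform).
KINEMATIC_OFFSET = np.array([1.0, 0.0])

# Simulate a short sliding window of odometry for both agents.
K = 20
true_theta, true_t = 0.3, np.array([0.5, -0.2])             # frame misalignment
p_i = np.cumsum(rng.normal(0.0, 0.1, size=(K, 2)), axis=0)  # agent i positions
p_j_world = p_i + KINEMATIC_OFFSET                          # kinematically consistent
# Express agent j's positions in its own (misaligned) odometry frame, plus noise.
p_j = (rot2d(-true_theta) @ (p_j_world - true_t).T).T
p_j += rng.normal(0.0, 0.01, size=(K, 2))

def residuals(x):
    """Mismatch between kinematically predicted and odometry-implied offsets."""
    theta, tx, ty = x
    p_j_in_i = (rot2d(theta) @ p_j.T).T + np.array([tx, ty])
    return (p_j_in_i - p_i - KINEMATIC_OFFSET).ravel()

sol = least_squares(residuals, x0=np.zeros(3))
print("estimated frame alignment (theta, tx, ty):", sol.x)
print("ground-truth alignment                   :", true_theta, true_t)
```

In the actual estimator, residuals of this kind are combined with visual-inertial odometry factors in a sliding window and solved jointly for all agents.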
The NOKOV Motion Capture System provided high-precision ground truth trajectories for the IAP agents and the central platform, enabling evaluation and validation of the proposed localization method.
Citation
Lai G, Shi C, Wang K, et al. Multi-agent visual-inertial localization for integrated aerial systems with loose fusion of odometry and kinematics[J]. IEEE Robotics and Automation Letters, 2024, 9(7): 6504-6511.
IEEE TASE (2025): Tight Fusion of Odometry, Kinematic Constraints, and UWB Ranging Systems for State Estimation of Integrated Aerial Platforms
This paper addresses the challenge of precise positioning for Integrated Aerial Platforms (IAPs) in complex environments. It proposes a method utilizing only onboard sensors and real UWB measurement data, enhancing the practical applicability of IAPs in real-world scenarios and offering novel insights for decentralized multi-aircraft positioning.
Environment setup during actual flight
The research innovatively integrates the IAP's physical constraints with Ultra-Wideband (UWB) ranging data to achieve rapid, efficient unification of the multi-vehicle coordinate systems and precise estimation of anchor positions. The team formulated a decentralized optimization problem for each sub-vehicle based on position, velocity, and attitude constraints, naming the resulting method the Vision-Inertial-Range-Physical Odometry (VIRPO) algorithm. The decentralized design reduces reliance on a central processing unit, enhancing system scalability. Extensive evaluation on datasets demonstrates that VIRPO achieves higher positioning accuracy than baseline methods, reducing odometry drift by 28.7% on real-world datasets. This study also marks the first integration of the algorithm into a real IAP system, with flight experiments successfully validating its performance in practical applications.
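The flavor of the per-vehicle optimization can be sketched as a weighted least-squares problem that jointly penalizes disagreement with the odometry prediction, the platform-kinematics prediction, and the UWB range measurements. The snippet below is a simplified position-only illustration with assumed anchor layout, weights, and noise; the actual VIRPO formulation also handles velocity, attitude, coordinate-frame unification, and anchor estimation.

```python
# Minimal per-vehicle sketch (not the paper's VIRPO implementation): one
# sub-vehicle refines its position by jointly weighting an odometry prior,
# a kinematic prior from the platform geometry, and UWB ranges to anchors.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

anchors = np.array([[0.0, 0.0, 0.0],
                    [5.0, 0.0, 0.0],
                    [0.0, 5.0, 0.0],
                    [5.0, 5.0, 2.0]])              # assumed UWB anchor positions
p_true = np.array([2.0, 3.0, 1.0])                 # true sub-vehicle position

ranges = np.linalg.norm(anchors - p_true, axis=1) + rng.normal(0, 0.05, 4)
p_odom = p_true + rng.normal(0, 0.20, 3)           # drifting odometry estimate
p_kin  = p_true + rng.normal(0, 0.10, 3)           # platform-kinematics prediction

W_UWB, W_ODOM, W_KIN = 1.0 / 0.05, 1.0 / 0.20, 1.0 / 0.10   # 1/sigma weights

def residuals(p):
    """Stack UWB range, odometry, and kinematic residuals into one problem."""
    r_uwb  = W_UWB  * (np.linalg.norm(anchors - p, axis=1) - ranges)
    r_odom = W_ODOM * (p - p_odom)
    r_kin  = W_KIN  * (p - p_kin)
    return np.concatenate([r_uwb, r_odom, r_kin])

sol = least_squares(residuals, x0=p_odom)
print("fused position:", sol.x, " true:", p_true)
```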
The NOKOV Motion Capture System provided high-precision pose data for the IAP, which was used to generate simulated UWB measurement data for evaluating the algorithm's performance.
Citation
Yu Y, Fan Y, Lai G, et al. Tight Fusion of Odometry, Kinematic Constraints, and UWB Ranging Systems for State Estimation of Integrated Aerial Platforms[J]. IEEE Transactions on Automation Science and Engineering, 2025.
IEEE TRO (2025): Versatile Tasks on Integrated Aerial Platforms Using Only Onboard Sensors: Control, Estimation, and Validation
Against this backdrop, Professor Yu Yushu's team published the paper “Versatile Tasks on Integrated Aerial Platforms Using Only Onboard Sensors: Control, Estimation, and Validation” in IEEE Transactions on Robotics (IEEE TRO), a top-tier robotics journal, in 2025. Building upon the prior work, this study proposes a comprehensive control and state estimation framework designed to fully exploit the potential of IAPs for executing diverse tasks.
Snapshot of different tasks for a three-vehicle IAP
The paper introduces a universal integrated framework that combines base control, interactive control, direct force/torque control, perception-aware target observation, and motion-odometry fusion state estimation. All of this is achieved without force/torque sensors or external positioning systems, significantly enhancing the autonomy and versatility of the system.
To maintain continuous target observation during motion, the study designs a vision-aware attitude correction algorithm termed Perception-Aware Model Predictive Control (PAMPC). This algorithm enables the IAP's complex dynamic system to keep the target continuously within its field of view. The coordinate-free, globally effective, and computationally efficient control scheme resolves the challenge of effective target perception in complex dynamic environments.
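The kind of perception-aware term such a controller can add to its objective is easy to illustrate: penalize the angle between the camera's optical axis and the bearing to the target, so the optimizer is pushed toward attitudes that keep the target in view. The sketch below uses assumed geometry and weighting and is not the paper's PAMPC formulation.

```python
# Minimal sketch of a perception-aware cost term (illustrative, not the
# paper's PAMPC): the cost grows as the target drifts off the optical axis.
import numpy as np

def perception_cost(cam_pos, cam_axis, target_pos, weight=1.0):
    """Return weight * (1 - cos(angle between optical axis and target bearing)).

    cam_axis: unit vector of the camera's optical axis in the world frame.
    The cost is 0 when the target is centered and 2 * weight when the
    target lies directly behind the camera.
    """
    bearing = target_pos - cam_pos
    bearing = bearing / np.linalg.norm(bearing)
    return weight * (1.0 - float(np.dot(cam_axis, bearing)))

# Example: the cost prefers an attitude that faces the target.
cam_pos = np.array([0.0, 0.0, 1.0])
target  = np.array([3.0, 1.0, 0.5])
facing_target = (target - cam_pos) / np.linalg.norm(target - cam_pos)
facing_away   = np.array([-1.0, 0.0, 0.0])
print(perception_cost(cam_pos, facing_target, target))  # ~0
print(perception_cost(cam_pos, facing_away, target))    # ~1.9
```

In a full perception-aware MPC, a term of this kind is added to the tracking and actuation costs over the prediction horizon, so the optimized attitude trades off task execution against keeping the target visible.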
To address autonomous positioning for IAPs without external positioning systems, this paper proposes a Relative Transformation Estimation (RTE) algorithm. By loosely coupling the kinematic constraints of the sub-vehicle and central platform with Visual-Inertial Odometry (VIO) data, this algorithm significantly improves the overall global positioning accuracy of the entire platform. Experiments conducted on a real IAP prototype demonstrate for the first time the feasibility of executing multiple complex tasks solely with onboard sensors, validating the effectiveness of the proposed framework and fusion algorithms.
The NOKOV Motion Capture System provided high-precision ground truth trajectories for the IAP, enabling rigorous evaluation and validation of the accuracy and efficacy of the proposed localization methods.
Citation
Wang K, Lai G, Yu Y, et al. Versatile Tasks on Integrated Aerial Platforms Using Only Onboard Sensors: Control, Estimation, and Validation[J]. IEEE Transactions on Robotics, 2025.